Overall Framework Overview
Our evaluation framework translates vision into action through a structured yet adaptable approach. It comprises three core components: the Container, the Playground, and the Instruments. Each is designed to harmonize universal principles with contextual specificity.
Container / Playground / Instruments Model
Container
The Container defines the logistical boundaries within which evaluation unfolds. These elements remain consistent across contexts, though their specific expressions adapt to local realities:
- When (Timing of Data Collection):
  - Responsive: Triggered by regular programming activities.
  - Periodic: Scheduled at regular intervals for continuous insight.
  - Emergent: Activated by unforeseen events or shifts.
- Which (Data Collection Touchpoints):
  - Verbose Input: Narrative reflections, qualitative feedback.
  - Point Systems: Quantitative rankings, ratings.
  - Programmatic Data: Code submissions, public chat histories.
- Who (Value-Generating Entities):
  - Individuals: Sovereign agents contributing personal insights and actions.
  - Communities: Stakeholder groups collectively generating value.
  - Systems: Broader processes and dynamics understood through ongoing interactions.
Together, these logistical parameters form a stable yet flexible container within which evaluation practices are embedded.
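The three Container parameters can be pictured as a simple configuration object that decides whether a given data point is in scope. This is a minimal sketch, not part of the framework itself; all class and value names are illustrative assumptions:

```python
from dataclasses import dataclass, field

# Illustrative vocabularies for the Container's three parameters.
TIMINGS = {"responsive", "periodic", "emergent"}                       # When
TOUCHPOINTS = {"verbose_input", "point_systems", "programmatic_data"}  # Which
ENTITIES = {"individual", "community", "system"}                       # Who

@dataclass
class Container:
    """Logistical boundaries for one evaluation context."""
    timings: set = field(default_factory=lambda: set(TIMINGS))
    touchpoints: set = field(default_factory=lambda: set(TOUCHPOINTS))
    entities: set = field(default_factory=lambda: set(ENTITIES))

    def accepts(self, timing: str, touchpoint: str, entity: str) -> bool:
        # A data point is in scope only if all three parameters match.
        return (timing in self.timings
                and touchpoint in self.touchpoints
                and entity in self.entities)

container = Container()
print(container.accepts("periodic", "point_systems", "community"))  # True
```

A local adaptation would simply narrow one of the default sets, e.g. a community that only collects data periodically.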
Playground
Within this Container lie different Playgrounds: exploratory spaces where stakeholders identify, articulate, and experiment with forms of value generation. The Playground is customizable, shaped by community priorities and evolving needs.
Each Playground has a primary goal that determines its design. By default, any Container holds three Playgrounds: the Facilitators, the Cohort, and the Place.
Below are some examples of the value elements that build a Playground:
- What (Forms of Valued Contributions):
  - Activities: Efforts invested in tangible actions and initiatives.
  - Knowledge: Documentation, insights shared from past experiences.
  - Space: Hosting engagements, facilitating connections within networks.
- Why (Purpose Behind Valuation):
  - Regenerative Shifts: Catalyzing systemic transformations toward sustainability.
  - Organizational Growth: Enhancing collective capacities and resilience.
  - Outputs Generated: Tangible results emerging from collaborative efforts.
- How (Visibility of Value Flow):
  - Actions: Observable behaviors captured via direct observation or technology.
  - Reflections: Personal narratives and subjective stakeholder experiences.
  - Attributions: Peer recognition highlighting contributions across dimensions.
These variables serve as illustrative templates that communities adapt to reflect their unique contexts and aspirations, fostering playful experimentation and authentic learning journeys.
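One way to make these value elements concrete is to tag each contribution along the What/Why/How axes. The sketch below is purely illustrative; the tag vocabularies are assumptions derived from the examples above, not a defined schema:

```python
from dataclasses import dataclass

# Illustrative vocabularies drawn from the value-element examples above.
WHAT = ("activities", "knowledge", "space")
WHY = ("regenerative_shifts", "organizational_growth", "outputs_generated")
HOW = ("actions", "reflections", "attributions")

@dataclass(frozen=True)
class Contribution:
    """One valued contribution, tagged along the three Playground axes."""
    description: str
    what: str   # form of the contribution
    why: str    # purpose behind valuing it
    how: str    # how the value flow became visible

    def __post_init__(self):
        for value, vocab in ((self.what, WHAT), (self.why, WHY), (self.how, HOW)):
            if value not in vocab:
                raise ValueError(f"unknown tag: {value!r}")

c = Contribution("Hosted a weekly peer-learning circle",
                 what="space", why="organizational_growth", how="attributions")
```

A community adapting the Playground would substitute its own vocabularies while keeping the three-axis shape.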
Instruments
The Instruments operationalize the evaluation process, providing dimensionality and depth through structured stages, technological tools, and analytical lenses:
- Stages of Delivery:
  - Baselining: Establishing an initial understanding of systemic conditions before intensive engagement begins.
  - Reading: Capturing data dynamically during active programming phases.
  - Inferencing: Synthesizing insights post-intensive to inform future actions.
- Technological Tools Employed:
  - Telegram Bots: Facilitating real-time collection of subjective data from stakeholders.
  - Data Repositories: Centralized platforms (e.g., GitHub repositories, wikis) for structured data aggregation.
  - AI-Enabled Analysis & Timelining: Leveraging artificial intelligence to interpret complex multi-capital flows and generate actionable insights.
- Analytical Dimensions Plotted Against:
  - Aspects: Three to seven primary modes through which value generation is expressed and tracked.
  - Capitals: Five to ten resource categories mobilized within the action-learning journey context.
  - Spheres: Realms within systems where evolutionary change is observed and evaluated.
These instruments collectively enable nuanced understanding of value generation dynamics, supporting informed decision-making and adaptive learning.
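A minimal way to picture the three analytical dimensions is a tally of observed value flows keyed by (aspect, capital, sphere). The dimension values below are placeholders, assuming a community that tracks three aspects, five capitals, and three spheres:

```python
from collections import Counter

# Placeholder dimension values; real communities define 3-7 aspects,
# 5-10 capitals, and their own spheres.
ASPECTS = ("hosting", "documenting", "mentoring")
CAPITALS = ("social", "intellectual", "financial", "living", "cultural")
SPHERES = ("individual", "community", "ecosystem")

tally = Counter()

def record(aspect: str, capital: str, sphere: str, weight: float = 1.0) -> None:
    """Plot one observed value flow against all three dimensions."""
    if aspect not in ASPECTS or capital not in CAPITALS or sphere not in SPHERES:
        raise ValueError("value flow outside the defined dimensions")
    tally[(aspect, capital, sphere)] += weight

record("mentoring", "intellectual", "individual")
record("hosting", "social", "community", weight=2.0)
print(tally[("hosting", "social", "community")])  # 2.0
```

Summing the tally along any one axis then yields the per-aspect, per-capital, or per-sphere views the analysis stage works with.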
How These Layers Fit Together
In practice, our methodology unfolds iteratively:
Beginning with the Baselining stage, facilitators collaborate closely with stakeholders to map out the current state across five systemic spheres relevant to the community. Together they identify key activities, termed Aspects, that serve as focal points for understanding how value is generated. Stakeholders engage actively with these Aspects throughout the programming cycle, mobilizing diverse forms of capital aligned with their collective goals.
As activities unfold, whether planned or emergent, data collection occurs continuously through accessible technological channels such as Telegram bots, alongside facilitator observations. The gathered data feeds into centralized repositories, where AI-driven analysis illuminates multi-capital flows in real time. These insights are then fed back into facilitation processes, enhancing stakeholder participation and enabling agile responses to emerging patterns of collective evolution.
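The collect-store-analyze loop described above could be sketched as a minimal pipeline. The function names and the source labels are assumptions for illustration, not the project's actual tooling, and the "analysis" step is a deliberately trivial stand-in for the AI-driven stage:

```python
def collect(event: dict) -> dict:
    """Normalize an incoming data point (e.g. from a bot or an observation)."""
    return {"source": event["source"], "text": event["text"], "stage": "reading"}

def store(repository: list, record: dict) -> None:
    """Append to a centralized repository (here, an in-memory list)."""
    repository.append(record)

def analyze(repository: list) -> dict:
    """Stand-in for AI-driven analysis: count records per source."""
    counts = {}
    for r in repository:
        counts[r["source"]] = counts.get(r["source"], 0) + 1
    return counts

repo = []
store(repo, collect({"source": "telegram_bot", "text": "Reflection on today"}))
store(repo, collect({"source": "facilitator", "text": "Observed peer support"}))
print(analyze(repo))  # {'telegram_bot': 1, 'facilitator': 1}
```

In practice the repository would be a GitHub repository or wiki and the analysis step an AI model, but the feedback loop has this same three-stage shape.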
Quick-Reference Diagram
```text
┌───────────────── CONTAINER ─────────────────────┐
│ WHEN        │ WHICH          │ WHO          │
│ Responsive  │ Verbose Input  │ Individuals  │
│ Periodic    │ Point Systems  │ Communities  │
│ Emergent    │ Programmatic   │ Systems      │
├───────────────── PLAYGROUND ────────────────────┤
│ WHAT        │ WHY            │ HOW          │
│ Activities  │ Regenerative   │ Actions      │
│ Knowledge   │ Organizational │ Reflections  │
│ Space       │ Outputs        │ Attributions │
├───────────────── INSTRUMENTS ───────────────────┤
│ STAGES      │ TOOLS          │ DIMENSIONS   │
│ Baselining  │ Telegram Bots  │ Aspects      │
│ Reading     │ Repositories   │ Capitals     │
│ Inferencing │ AI Analysis    │ Spheres      │
└─────────────────────────────────────────────────┘
```
This framework provides a comprehensive yet adaptable structure for evaluation that honors both universal principles and local contexts, enabling communities to track and learn from the flows of multiple forms of capital throughout their regenerative journey.